Minimum Density Hyperplanes
Associating distinct groups of objects (clusters) with contiguous regions of
high probability density (high-density clusters) is central to many
statistical and machine learning approaches to the classification of unlabelled
data. We propose a novel hyperplane classifier for clustering and
semi-supervised classification which is motivated by this objective. The
proposed minimum density hyperplane minimises the integral of the empirical
probability density function along it, thereby avoiding intersection with
high-density clusters. We show that the minimum density and the maximum margin
hyperplanes are asymptotically equivalent, thus linking this approach to
maximum margin clustering and semi-supervised support vector classifiers. We
propose a projection pursuit formulation of the associated optimisation problem
which allows us to find minimum density hyperplanes efficiently in practice,
and evaluate its performance on a range of benchmark datasets. The proposed
approach is found to be very competitive with state-of-the-art methods for
clustering and semi-supervised classification.
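The idea of minimising the projected density along a hyperplane can be sketched numerically: project the data onto a candidate unit direction, estimate the density of the projection with a 1-D kernel density estimate, and keep the (direction, offset) pair whose split point has the lowest estimated density. The Python sketch below uses brute-force random search over directions and Silverman's bandwidth rule; it is an illustrative stand-in, not the projection pursuit formulation of the paper, and all function names are hypothetical.

```python
import numpy as np

def kde_1d(x, points, h):
    """Gaussian kernel density estimate of the projected sample at x."""
    return np.mean(np.exp(-0.5 * ((x - points) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def min_density_hyperplane(X, n_dirs=200, n_grid=100, seed=0):
    """Crude random search: for each unit direction v, scan offsets b and
    keep the (v, b) whose split point has the lowest 1-D projected density."""
    rng = np.random.default_rng(seed)
    best_dens, best_v, best_b = np.inf, None, None
    for _ in range(n_dirs):
        v = rng.normal(size=X.shape[1])
        v /= np.linalg.norm(v)
        p = X @ v                               # 1-D projection of the data
        h = 1.06 * p.std() * len(p) ** -0.2     # Silverman's bandwidth rule
        lo, hi = np.quantile(p, [0.1, 0.9])     # keep the split non-trivial
        for b in np.linspace(lo, hi, n_grid):
            d = kde_1d(b, p, h)
            if d < best_dens:
                best_dens, best_v, best_b = d, v, b
    return best_dens, best_v, best_b

# Two well-separated Gaussian blobs: the lowest-density hyperplane should
# pass between them, so thresholding the projection recovers the clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-4, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
dens, v, b = min_density_hyperplane(X)
labels = (X @ v > b).astype(int)
```

On this toy data the recovered hyperplane separates the two blobs; the paper's projection pursuit approach replaces the random search with a principled optimisation over directions.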
Weighted sparse simplex representation: a unified framework for subspace clustering, constrained clustering, and active learning
Spectral-based subspace clustering methods have proved successful in many challenging applications such as gene sequencing, image recognition, and motion segmentation. In this work, we first propose a novel spectral-based subspace clustering algorithm that seeks to represent each point as a sparse convex combination of a few nearby points. We then extend the algorithm to a constrained clustering and active learning framework. Our motivation for developing such a framework stems from the fact that typically either a small amount of labelled data is available in advance, or it is possible to label some points at a cost. The latter scenario is typically encountered in the process of validating a cluster assignment. Extensive experiments on simulated and real datasets show that the proposed approach is effective and competitive with state-of-the-art methods.
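The representation step can be illustrated as follows: each point is written as a convex combination of its k nearest neighbours by minimising a least-squares objective over the probability simplex (coefficients non-negative and summing to one). The sketch below uses projected gradient descent with a standard Euclidean simplex projection; it omits the weighting scheme and the subsequent spectral clustering stage, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto the probability simplex (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

def convex_representation(X, i, k=10, steps=500):
    """Represent X[i] as a convex combination of its k nearest neighbours
    via projected gradient descent on ||x_i - N c||^2 over the simplex."""
    dist = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dist)[1:k + 1]          # exclude the point itself
    N = X[nbrs].T                             # columns are neighbours
    lr = 1.0 / np.linalg.norm(N, 2) ** 2      # step size from spectral norm
    c = np.full(k, 1.0 / k)
    for _ in range(steps):
        c = project_simplex(c - lr * (N.T @ (N @ c - X[i])))
    return nbrs, c

# Two clusters: a point's representation should only use neighbours
# from its own cluster, which is what makes the affinity block-diagonal.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(8, 1, (50, 3))])
nbrs, c = convex_representation(X, i=0)
```

In the full pipeline such coefficients would be assembled into an affinity matrix and fed to spectral clustering; the block structure of the coefficients across points is what encodes the subspaces.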
Spiking neural network training using evolutionary algorithms
Networks of spiking neurons can perform complex non-linear computations in fast temporal coding just as well as rate-coded networks. These networks differ from previous models in that spiking neurons communicate information by the timing, rather than the rate, of spikes. To apply spiking neural networks to particular tasks, a learning process is required. Most existing training algorithms are based on unsupervised Hebbian learning. In this paper, we investigate the performance of the parallel differential evolution algorithm as a supervised training algorithm for spiking neural networks. The approach was successfully tested on well-known and widely used classification problems.
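Differential evolution trains a network without gradients: each candidate weight vector is mutated with scaled difference vectors from the population, crossed over, and kept only if it lowers the supervised loss. A minimal sketch of the DE/rand/1/bin scheme is shown below; as an assumption for brevity it trains a tiny conventional feedforward network on XOR rather than a spiking network, and it is serial rather than the parallel variant the paper investigates.

```python
import numpy as np

def net(w, X):
    """Tiny 2-2-1 feedforward net; w packs all 9 weights and biases."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def mse(w, X, y):
    return np.mean((net(w, X) - y) ** 2)

def differential_evolution(f, dim, pop_size=30, F=0.7, CR=0.9,
                           gens=300, seed=0):
    """DE/rand/1/bin: mutate with a scaled difference of two random
    members, binomial crossover, greedy one-to-one selection."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-2, 2, (pop_size, dim))
    fit = np.array([f(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)
            trial = np.where(rng.random(dim) < CR, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:           # keep the trial only if no worse
                pop[i], fit[i] = trial, ft
    best = np.argmin(fit)
    return pop[best], fit[best]

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
w, err = differential_evolution(lambda w: mse(w, X, y), dim=9)
```

Because DE needs only loss evaluations, the same loop applies unchanged when `net` is replaced by a spike-timing simulation whose error is not differentiable.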
Cell-nuclear data reduction and prognostic model selection in bladder tumor recurrence
Objective: The paper aims at improving the prediction of superficial bladder tumor recurrence. To this end, feedforward neural networks (FNNs) and a feature selection method based on unsupervised clustering were employed. Material and methods: A retrospective prognostic study of 127 patients diagnosed with superficial urinary bladder cancer was performed. Images from biopsies were digitized and cell nuclei features were extracted. To design FNN classifiers, different training methods and architectures were investigated. The unsupervised k-windows (UKW) and the fuzzy c-means clustering algorithms were applied to the feature set to identify the most informative feature subsets. Results: UKW reduced the dimensionality of the feature space significantly and yielded prediction rates of 87.95% and 91.41% for non-recurrent and recurrent cases, respectively. The prediction rates achieved with the reduced feature set were marginally lower than those attained with the complete feature set. The training algorithm that exhibited the best performance in all cases was the adaptive on-line backpropagation algorithm. Conclusions: FNNs can contribute to the accurate prognosis of bladder cancer recurrence. The proposed feature selection method can remove redundant information without a significant loss in predictive accuracy, thereby rendering the prognostic model less complex, more robust, and hence suitable for clinical use.
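One common way to realise clustering-based feature selection of this kind is to cluster the standardized feature columns and keep a single representative feature per cluster, so that groups of mutually redundant measurements contribute one feature each. The sketch below uses plain k-means with farthest-point initialisation as an illustrative stand-in; it does not implement the UKW algorithm, and all names are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Lloyd's k-means with farthest-point initialisation."""
    C = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in C], axis=0)
        C.append(X[d.argmax()])                 # farthest point so far
    C = np.array(C)
    for _ in range(iters):
        lab = np.linalg.norm(X[:, None] - C[None], axis=2).argmin(1)
        for j in range(k):
            if np.any(lab == j):
                C[j] = X[lab == j].mean(0)
    return lab, C

def select_features(X, k):
    """Cluster standardized feature columns; per cluster, keep the
    feature nearest its centroid as the group's representative."""
    F = ((X - X.mean(0)) / X.std(0)).T          # rows = features
    lab, C = kmeans(F, k)
    keep = []
    for j in range(k):
        idx = np.where(lab == j)[0]
        if len(idx):
            keep.append(idx[np.linalg.norm(F[idx] - C[j], axis=1).argmin()])
    return sorted(keep)

# Six features built from two underlying signals: columns 0-2 are
# rescaled/noisy copies of one signal, columns 3-5 of another.
rng = np.random.default_rng(0)
s1, s2 = rng.normal(size=(2, 200))
X = np.column_stack([s1, s1 + 0.01 * rng.normal(size=200), 2 * s1,
                     s2, 3 * s2, s2 + 0.01 * rng.normal(size=200)])
keep = select_features(X, k=2)
```

The selected subset contains one feature from each redundant group, mirroring how the clustering-based reduction in the study shrinks the nuclear feature set while preserving its predictive content.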